Tech Today: Measuring the Buzz, Hum, and Rattle


NASA


2 min read

Preparations for Next Moonwalk Simulations Underway (and Underwater)

An array of microphones on an airfield, with a sunrise in the background
The Wireless Array developed by Interdisciplinary Consulting Corporation (IC2), laid out here for a test flight at Langley Research Center, makes flight testing for drones quick and cost-effective.
Credit: NASA

Anyone who lives near an airport or is experiencing the emergence of a cicada brood can quickly identify the source of that ongoing noise. However, running tests to identify the noise created by a new drone or find pests in a field of crops requires a high-tech solution that maps sound.

With help from NASA, Interdisciplinary Consulting Corporation (IC2) introduced a new Wireless Array to do just that – anywhere, anytime. Airplanes must undergo noise testing and certification to ensure they don’t exceed the Federal Aviation Administration’s noise limits. Each small, saucer-shaped base, called a node, is equipped with an embedded microphone that measures the air pressure changes created by overhead sounds. For a large vehicle like an airplane, hundreds of these sensors, together forming a microphone array, are laid out in a pattern on a runway to monitor the underside of the plane as it flies over.

Interested in making its flight tests more affordable, NASA’s Langley Research Center in Hampton, Virginia, supported the company with Small Business Innovation Research contracts and expert consulting.

“Each node contains a small computer system able to acquire and store data in memory on an SD card. It also has a small web server that allows the end user to start acquisition, stop recording, download files, check on the battery health, and more,” said Chip Patterson, vice president of IC2.

All it takes to operate an individual node or an extensive array is an off-the-shelf wireless access point and a standard laptop with IC2’s software application. The technology integrates into existing noise testing systems.

The microphone can easily be swapped for other sensor types. An acoustic sensor makes it possible to monitor animal noises that indicate health and well-being, while an infrasonic sensor could measure the noise from supersonic aircraft, identifying the direction and arrival time of a sonic boom.
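
Locating a source with an array comes down to comparing arrival times across sensors. The sketch below is our own illustration of the basic geometry, not IC2's software: for a distant (far-field) source, the delay between two microphones a known distance apart fixes the angle of arrival.

```python
import math

def bearing_from_tdoa(delta_t, spacing_m, speed_of_sound=343.0):
    """Angle of arrival (radians from broadside) of a far-field plane wave,
    given the arrival-time difference between two microphones."""
    # Geometry of a plane wave crossing the pair: delta_t = spacing * sin(theta) / c
    s = delta_t * speed_of_sound / spacing_m
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)

# A boom arriving ~1.458 ms apart at microphones 1 m apart comes from 30 degrees.
angle_deg = math.degrees(bearing_from_tdoa(0.5 / 343.0, 1.0))
print(round(angle_deg, 1))  # 30.0
```

A real array uses many such pairs at once, which is why hundreds of nodes laid out in a pattern can map a sound source rather than just point at it.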

This small, portable technology is finding its way into various projects and applications beyond aircraft testing. Working with an entomologist, IC2 will use acoustic data to listen for high-frequency insect sounds in agricultural settings. Discovering where insects feed on crops will allow farmers to intervene before they do too much damage while limiting pesticide use in those areas. With NASA’s help, IC2’s Wireless Array technology enables sound-based solutions in agriculture, aerospace, and beyond. 


  • Similar Topics

    • By NASA
      5 Min Read Reinventing the Clock: NASA’s New Tech for Space Timekeeping
      The Optical Atomic Strontium Ion Clock is a higher-precision atomic clock that is small enough to fit on a spacecraft. Credit: NASA/Matthew Kaufman
      Here on Earth, it might not matter if your wristwatch runs a few seconds slow. But crucial spacecraft functions need accuracy down to one billionth of a second or less. Navigating with GPS, for example, relies on precise timing signals from satellites to pinpoint locations. Three teams at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, are working to push timekeeping for space exploration to new levels of precision.
      One team develops highly precise quantum clock synchronization techniques to aid essential spacecraft communication and navigation. Another Goddard team is working to employ the technique of clock synchronization in space-based platforms to enable telescopes to function as one enormous observatory. The third team is developing an atomic clock for spacecraft based on strontium, a metallic chemical element, to enable scientific observations not possible with current technology. The need for increasingly accurate timekeeping is why these teams at NASA Goddard, supported by the center’s Internal Research and Development program, hone clock precision and synchronization with innovative technologies like quantum and optical communications.
      Syncing Up Across the Solar System
      “Society requires clock synchronization for many crucial functions like power grid management, stock market openings, financial transactions, and much more,” said Alejandro Rodriguez Perez, a NASA Goddard researcher. “NASA uses clock synchronization to determine the position of spacecraft and set navigation parameters.”
      If you line up two clocks and sync them together, you might expect that they will tick at the same rate forever. In reality, the more time passes, the more out of sync the clocks become, especially if those clocks are on spacecraft traveling at tens of thousands of miles per hour. Rodriguez Perez seeks to develop a new way of precisely synchronizing such clocks and keeping them synced using quantum technology.
      Work on the quantum clock synchronization protocol takes place in this lab at NASA’s Goddard Space Flight Center in Greenbelt, Md. Credit: NASA/Matthew Kaufman
      In quantum physics, two particles are entangled when they behave like a single object and occupy two states at once. For clocks, applying quantum protocols to entangled photons could allow for a precise and secure way to sync clocks across long distances.
      The heart of the synchronization protocol is called spontaneous parametric down-conversion, in which one photon breaks apart and two new photons form. Two detectors will each record when the new photons appear, and the devices will apply mathematical functions to determine the offset in time between the two photons, thus synchronizing the clocks.
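
A toy version of that offset search (our sketch, not the Goddard protocol) slides one detector's timestamps against the other's and keeps the clock offset that produces the most photon-pair coincidences:

```python
def estimate_offset(times_a, times_b, candidate_offsets, window):
    """Return the candidate clock offset (seconds added to detector B's
    timestamps) that yields the most coincidences with detector A."""
    def coincidences(offset):
        shifted = sorted(t + offset for t in times_b)
        count, j = 0, 0
        for t in sorted(times_a):
            # Advance past shifted timestamps that are too early to match t.
            while j < len(shifted) and shifted[j] < t - window:
                j += 1
            if j < len(shifted) and abs(shifted[j] - t) <= window:
                count += 1
        return count
    return max(candidate_offsets, key=coincidences)

# Detector B's clock runs 0.3 s behind: shifting it by +0.3 s realigns the pairs.
a = [1.0, 2.0, 3.5, 7.2]
b = [t - 0.3 for t in a]
print(estimate_offset(a, b, [-0.3, 0.0, 0.3], window=1e-6))  # 0.3
```

The real protocol works at nanosecond scales and with far denser photon streams, but the principle is the same: the true offset stands out as a sharp peak in the coincidence count.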
      While clock synchronization is currently done using GPS, this protocol could make it possible to precisely synchronize clocks in places where GPS access is limited, like the Moon or deep space.
      Syncing Clocks, Linking Telescopes to See More than Ever Before
      When it comes to astronomy, the usual rule of thumb is the bigger the telescope, the better its imagery.
      “If we could hypothetically have a telescope as big as Earth, we would have incredibly high-resolution images of space, but that’s obviously not practical,” said Guan Yang, an optical physicist at NASA Goddard. “What we can do, however, is have multiple telescopes in various locations and have each telescope record the signal with high time precision. Then we can stitch their observations together and produce an ultra-high-res image.”
      The idea of linking together the observations of a network of smaller telescopes to achieve the power of a larger one is called very long baseline interferometry, or VLBI.
      For VLBI to produce a whole greater than the sum of its parts, the telescopes need high-precision clocks. The telescopes record data alongside timestamps of when the data was recorded. High-powered computers assemble all the data together into one complete observation with greater detail than any one of the telescopes could achieve on its own. This technique is what allowed the Event Horizon Telescope’s network of observatories to produce the first image of a black hole at the center of our galaxy.
      The Event Horizon Telescope (EHT) — a planet-scale array of eight ground-based radio telescopes forged through international collaboration — was designed to capture images of a black hole. Although the telescopes making up the EHT are not physically connected, they are able to synchronize their recorded data with atomic clocks. Credit: EHT Collaboration
      Yang’s team is developing a clock technology that could be useful for missions looking to take the technique from Earth into space, which could unlock many more discoveries.
      An Optical Atomic Clock Built for Space Travel
      Spacecraft navigation systems currently rely on onboard atomic clocks to obtain the most accurate time possible. Holly Leopardi, a physicist at NASA Goddard, is researching optical atomic clocks, a more precise type of atomic clock.
      While optical atomic clocks exist in laboratory settings, Leopardi and her team seek to develop a spacecraft-ready version that will provide more precision.
      The team works on OASIC, which stands for Optical Atomic Strontium Ion Clock. While current spacecraft utilize microwave frequencies, OASIC uses optical frequencies.
      The Optical Atomic Strontium Ion Clock is a higher-precision atomic clock that is small enough to fit on a spacecraft. Credit: NASA/Matthew Kaufman
      “Optical frequencies oscillate much faster than microwave frequencies, so we can have a much finer resolution of counts and more precise timekeeping,” Leopardi said.
      The OASIC technology is about 100 times more precise than the previous state-of-the-art in spacecraft atomic clocks. The enhanced accuracy could enable new types of science that were not previously possible.
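
The precision advantage follows from simple arithmetic. A clock's finest tick is one oscillation period, 1/f, and optical transitions oscillate tens of thousands of times faster than microwave ones. The figures below are rough, illustrative values, not mission specifications:

```python
# Rough, illustrative frequencies (not mission specifications):
MICROWAVE_HZ = 9.192631770e9  # cesium hyperfine transition that defines the SI second
OPTICAL_HZ = 4.4478e14        # approximate strontium-ion optical clock transition

# A clock's finest "tick" is one oscillation period, 1/f.
microwave_period_s = 1.0 / MICROWAVE_HZ
optical_period_s = 1.0 / OPTICAL_HZ

# Optical cycles are shorter by this factor, allowing finer time resolution.
resolution_gain = microwave_period_s / optical_period_s
print(f"optical ticks are ~{resolution_gain:,.0f}x shorter")
```

Counting cycles of a faster oscillator is what lets an optical clock slice a second far more finely than a microwave one.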
      “When you use these ultra-high precision clocks, you can start looking at the fundamental physics changes that occur in space,” Leopardi said, “and that can help us better understand the mechanisms of our universe.”
      The timekeeping technologies developed by these teams could enable new discoveries in our solar system and beyond.
      More on cutting-edge technology development at NASA Goddard
      By Matthew Kaufman, with additional contributions from Avery Truman
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      Details
      Last Updated: Sep 18, 2024
      Editor: Rob Garner
      Contact: Rob Garner, rob.garner@nasa.gov
      Location: Goddard Space Flight Center
      Related Terms: Goddard Technology, Communicating and Navigating with Missions, Goddard Space Flight Center, Technology
    • By NASA
      3 min read
      While astronaut Gene Cernan was on the lunar surface during the Apollo 17 mission, his spacesuit collected loads of lunar dust. The gray, powdery substance stuck to the fabric and entered the capsule, causing eye, nose, and throat irritation dubbed “lunar hay fever.” Credit: NASA
      Moon dust, or regolith, isn’t like the particles on Earth that collect on bookshelves or tabletops – it’s abrasive and it clings to everything. Throughout NASA’s Apollo missions to the Moon, regolith posed a challenge to astronauts and valuable space hardware.

      During the Apollo 17 mission, astronaut Harrison Schmitt described his reaction to breathing in the dust as “lunar hay fever,” experiencing sneezing, watery eyes, and a sore throat. The symptoms went away, but concern for human health is a driving force behind NASA’s extensive research into all forms of lunar soil.
      Research driven by the need to manage the dust to protect astronaut health and critical technology is already proving beneficial on Earth in the fight against air pollution.

      Working as a contributor on a habitat for NASA’s Next Space Technologies for Exploration Partnerships (NextSTEP) program, Lunar Outpost Inc. developed an air-quality sensor system to detect and measure lunar soil suspended in the air – a system that also detects pollutants on Earth.

      The company, originally based in Denver and now headquartered in Golden, Colorado, developed an air-quality sensor called the Space Canary and offered it to Lockheed Martin Space for its NextSTEP lunar orbit habitat prototype. After the device was integrated into the habitat’s environmental control system, it provided distinct advantages over traditional equipment.

      Rebranded as Canary-S (Solar), the sensor is now meeting a need for low-cost, wireless air-quality and meteorological monitoring on Earth. The self-contained unit, powered by solar energy and a battery, transmits data using cellular technology. It can measure a variety of pollutants, including particulate matter, carbon monoxide, methane, sulfur dioxide, and volatile organic compounds, among others. The device sends a message up to a secure cloud every minute, where it’s routed to either Lunar Outpost’s web-based dashboard or a customer’s database for viewing and analysis.

      The oil and gas industry uses the Canary-S sensors to provide continuous, real-time monitoring of fugitive gas emissions, and the U.S. Forest Service uses them to monitor forest-fire emissions.

      “Firefighters have been exhibiting symptoms of carbon monoxide poisoning for decades. They thought it was just part of the job,” explained Julian Cyrus, chief operating officer of Lunar Outpost. “But the sensors revealed where and when carbon monoxide levels were sky high, making it possible to issue warnings for firefighters to take precautions.”

      The Canary-S sensors exemplify the life-saving technologies that can come from the collaboration of NASA and industry innovations. 
      Read More
      Details
      Last Updated: Sep 17, 2024
      Related Terms: Technology Transfer & Spinoffs, Spinoffs, Technology Transfer
      Explore More
      2 min read Printed Engines Propel the Next Industrial Revolution
      Efforts to 3D print engines produce significant savings in rocketry and beyond
      Article 5 days ago

      2 min read Tech Today: Flipping NASA Tech and Sticking the Landing
      NASA tech adds gecko grip to phone accessory
      Article 1 month ago

      2 min read Tech Today: Space Age Swimsuit Reduces Drag, Breaks Records
      SpeedoUSA worked with Langley Research Center to design a swimsuit with reduced surface drag.
      Article 2 months ago

      Keep Exploring
      Discover Related Topics
      Technology Transfer and Spinoffs News
      Humans in Space
      Climate Change
      Solar System
    • By Space Force
      DAF senior leaders focused on how the Air Force and Space Force must capitalize on and leverage acceptable risk in future planning, adapting to the resourcing and risks present in today’s dynamic environment.

    • By NASA
      5 Min Read NASA Optical Navigation Tech Could Streamline Planetary Exploration
      Optical navigation technology could help astronauts and robots find their way using data from cameras and other sensors. Credit: NASA
      As astronauts and rovers explore uncharted worlds, finding new ways of navigating these bodies is essential in the absence of traditional navigation systems like GPS. Optical navigation, relying on data from cameras and other sensors, can help spacecraft — and in some cases, astronauts themselves — find their way in areas that would be difficult to navigate with the naked eye. Three NASA researchers are pushing optical navigation tech further, making cutting-edge advancements in 3D environment modeling, navigation using photography, and deep learning image analysis.
      In a dim, barren landscape like the surface of the Moon, it can be easy to get lost. With few discernible landmarks to navigate with the naked eye, astronauts and rovers must rely on other means to plot a course.
      As NASA pursues its Moon to Mars missions, encompassing exploration of the lunar surface and the first steps on the Red Planet, finding novel and efficient ways of navigating these new terrains will be essential. That’s where optical navigation comes in — a technology that helps map out new areas using sensor data.
      NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is a leading developer of optical navigation technology. For example, GIANT (the Goddard Image Analysis and Navigation Tool) helped guide the OSIRIS-REx mission to a safe sample collection at asteroid Bennu by generating 3D maps of the surface and calculating precise distances to targets.
      Now, three research teams at Goddard are pushing optical navigation technology even further.
      Virtual World Development
      Chris Gnam, an intern at NASA Goddard, leads development on a modeling engine called Vira that already renders large, 3D environments about 100 times faster than GIANT. These digital environments can be used to evaluate potential landing areas, simulate solar radiation, and more.
      While consumer-grade graphics engines, like those used for video game development, quickly render large environments, most cannot provide the detail necessary for scientific analysis. For scientists planning a planetary landing, every detail is critical.
      Vira can quickly and efficiently render an environment in great detail. Credit: NASA
      “Vira combines the speed and efficiency of consumer graphics modelers with the scientific accuracy of GIANT,” Gnam said. “This tool will allow scientists to quickly model complex environments like planetary surfaces.”
      The Vira modeling engine is being used to assist with the development of LuNaMaps (Lunar Navigation Maps). This project seeks to improve the quality of maps of the lunar South Pole region, which is a key exploration target of NASA’s Artemis missions.
      Vira also uses ray tracing to model how light will behave in a simulated environment. While ray tracing is often used in video game development, Vira utilizes it to model solar radiation pressure, which refers to the changes in a spacecraft’s momentum caused by sunlight.
      Vira can accurately render indirect lighting, which is when an area is still lit up even though it is not directly facing a light source. Credit: NASA
      Find Your Way with a Photo
      Another team at Goddard is developing a tool to enable navigation based on images of the horizon. Andrew Liounis, an optical navigation product design lead, leads the team, working alongside NASA interns Andrew Tennenbaum and Will Driessen, as well as Alvin Yew, the gas processing lead for NASA’s DAVINCI mission.
      An astronaut or rover using this algorithm could take one picture of the horizon, which the program would compare to a map of the explored area. The algorithm would then output the estimated location of where the photo was taken.
      Using one photo, the algorithm can estimate a location to within hundreds of feet. Current work aims to prove that with two or more pictures, the algorithm can pinpoint a location to within tens of feet.
      “We take the data points from the image and compare them to the data points on a map of the area,” Liounis explained. “It’s almost like how GPS uses triangulation, but instead of having multiple observers to triangulate one object, you have multiple observations from a single observer, so we’re figuring out where the lines of sight intersect.”
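
A stripped-down 2D sketch of that intersection idea (our illustration with made-up landmark coordinates, not the Goddard algorithm): given the map positions of two recognized landmarks and the measured bearing to each, the observer sits where the two lines of sight cross.

```python
import math

def locate(landmark_a, bearing_a, landmark_b, bearing_b):
    """Observer position from two landmarks (map x, y) and the measured
    bearing (radians, counterclockwise from +x) toward each one."""
    ax, ay = landmark_a
    bx, by = landmark_b
    # Unit line-of-sight vectors from the observer toward each landmark.
    uax, uay = math.cos(bearing_a), math.sin(bearing_a)
    ubx, uby = math.cos(bearing_b), math.sin(bearing_b)
    # Observer P satisfies P + s*uA = A and P + t*uB = B,
    # so s*uA - t*uB = A - B; solve the 2x2 system for s (Cramer's rule).
    det = -uax * uby + ubx * uay
    rx, ry = ax - bx, ay - by
    s = (-rx * uby + ubx * ry) / det
    return (ax - s * uax, ay - s * uay)

# A landmark due east at (10, 0) and one due north at (0, 5)
# place the observer at the origin.
x, y = locate((10.0, 0.0), 0.0, (0.0, 5.0), math.pi / 2)
print(abs(x) < 1e-9, abs(y) < 1e-9)  # True True
```

The flight version matches many image features against the map at once and solves in 3D, which is what pushes the accuracy from hundreds of feet toward tens of feet.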
      This type of technology could be useful for lunar exploration, where it is difficult to rely on GPS signals for location determination.
      A Visual Perception Algorithm to Detect Craters
      To automate optical navigation and visual perception processes, Goddard intern Timothy Chase is developing a programming tool called the GAVIN (Goddard AI Verification and Integration) Tool Suite.
      This tool helps build deep learning models, a type of machine learning algorithm that is trained to process inputs like a human brain. In addition to developing the tool itself, Chase and his team are building a deep learning algorithm using GAVIN that will identify craters in poorly lit areas, such as the Moon.
      “As we’re developing GAVIN, we want to test it out,” Chase explained. “This model that will identify craters in low-light bodies will not only help us learn how to improve GAVIN, but it will also prove useful for missions like Artemis, which will see astronauts exploring the Moon’s south pole region — a dark area with large craters — for the first time.”
      As NASA continues to explore previously uncharted areas of our solar system, technologies like these could help make planetary exploration at least a little bit simpler. Whether by developing detailed 3D maps of new worlds, navigating with photos, or building deep learning algorithms, the work of these teams could bring the ease of Earth navigation to new worlds.
      By Matthew Kaufman
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      Details
      Last Updated: Aug 07, 2024
      Editor: Rob Garner
      Contact: Rob Garner, rob.garner@nasa.gov
      Location: Goddard Space Flight Center
      Related Terms: Goddard Technology, Artificial Intelligence (AI), Goddard Space Flight Center, Technology
      Explore More
      4 min read NASA Improves GIANT Optical Navigation Technology for Future Missions
      Goddard's GIANT optical navigation software helped guide the OSIRIS-REx mission to the Asteroid Bennu. Today…
      Article 10 months ago

      4 min read Space Station Research Contributes to Navigation Systems for Moon Voyages
      Article 2 years ago

      5 min read NASA, Industry Improve Lidars for Exploration, Science
      NASA engineers will test a suite of new laser technologies from an aircraft this summer…
      Article 5 months ago
    • By NASA
      2 min read
      Akeem Shannon showcasing Flipstik attached to a smartphone. The product’s design was improved by looking at NASA research to inform its gecko-inspired method of adhering to surfaces. Credit: Flipstik Inc.
      When it comes to innovative technologies, inventors often find inspiration in the most unexpected places. A former salesman, Akeem Shannon, was inspired by his uncle, who worked as an engineer at NASA’s Marshall Space Flight Center in Huntsville, Alabama, to research the agency’s published technologies. He came across a sticky NASA invention that would help him launch his breakout product.

      In the early 2010s, a team of roboticists at NASA’s Jet Propulsion Laboratory in Southern California was exploring methods to enhance robots’ gripping capabilities. They came across the van der Waals force – a weak electrostatic attraction that forms at the molecular level when points on two surfaces make contact. This is the same force that geckos use to climb along walls.

      Much like a gecko’s foot, this apparatus developed at the Jet Propulsion Laboratory uses tiny fibers to grip objects and hold them tight. This work later inspired and informed the development of Flipstik. Credit: NASA
      The microscopic hairs on gecko toe pads are called setae, which gives the technology the nickname “synthetic setae.” While Shannon couldn’t use this NASA technology to hang a TV on a wall, he saw a way to mount a much smaller screen – a cellphone.

      A synthetic setae attachment on a cellphone case could stick to most surfaces, such as mirrors or the back of airplane seats. With a product design in hand, Shannon founded St. Louis-based Flipstik Inc. in 2018. Shannon wanted to make a reliable product that could be used multiple times in various situations. He said the published NASA research, which describes methods of molding and casting the tiny hairs to be more durable, was indispensable to making his product portable and reusable. 

      Flipstik has made an impact on the mobile device industry. In addition to people using it to mount their phones to watch videos, it has become popular among content creators to capture camera angles. Flipstik also allows deaf users to keep their hands free, enabling them to make video calls in sign language. From geckos to NASA research, Shannon’s innovation is a reminder that inspiration can come from anywhere. 

      Read More
      Details
      Last Updated: Aug 06, 2024
      Related Terms: Technology Transfer & Spinoffs, Jet Propulsion Laboratory, Robotics, Spinoffs, Technology Transfer
      Explore More
      6 min read Quantum Scale Sensors used to Measure Planetary Scale Magnetic Fields
      Magnetic fields are everywhere in our solar system. They originate from the Sun, planets, and…
      Article 1 hour ago

      4 min read AstroViz: Iconic Pillars of Creation Star in NASA’s New 3D Visualization
      NASA’s Universe of Learning – a partnership among the Space Telescope Science Institute (STScI), Caltech/IPAC,…
      Article 20 hours ago

      7 min read NASA’s Perseverance Rover Scientists Find Intriguing Mars Rock
      Article 2 weeks ago

      Keep Exploring
      Discover Related Topics
      Robotics
      Jet Propulsion Laboratory
      Technology Transfer & Spinoffs
      Technology